LP Relaxations of Some NP-Hard Problems Are as Hard as Any LP

Authors

  • Daniel Prusa
  • Tomás Werner
Abstract

We show that solving linear programming (LP) relaxations of many classical NP-hard combinatorial optimization problems is as hard as solving the general LP problem. Precisely, the general LP can be reduced in linear time to the LP relaxation of each of these problems. This result poses a fundamental limitation on designing efficient algorithms to solve the LP relaxations, because finding such an algorithm might improve the complexity of the best known algorithms for the general LP. Besides linear-time reductions, we show that the LP relaxations of the considered problems are P-complete under log-space reductions, and therefore also hard to parallelize.
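To make the claim concrete, here is the shape of the two objects being compared. The general LP problem is

\[
  \min\; c^\top x \quad \text{s.t.} \quad Ax \ge b,\; x \ge 0,
\]

while an LP relaxation is the structured LP obtained from a 0/1 integer program by dropping integrality. As a purely illustrative example (the abstract does not name the specific problems; vertex cover is used here only because it is a familiar case and also appears in the related work below), the LP relaxation of unweighted vertex cover on a graph \(G=(V,E)\) is

\[
  \min\; \sum_{v\in V} x_v \quad \text{s.t.} \quad x_u + x_v \ge 1 \;\;\forall \{u,v\}\in E, \qquad 0 \le x_v \le 1,
\]

where the integrality constraint \(x_v \in \{0,1\}\) has been replaced by the box constraint \(0 \le x_v \le 1\). The result states that LPs of such restricted forms are, up to a linear-time reduction, no easier than the general form above.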


Related articles

Bounding the Integrality Distance of LP Relaxations for Structured Prediction

In structured prediction, a predictor optimizes an objective function over a combinatorial search space, such as the set of all image segmentations, or the set of all part-of-speech taggings. Unfortunately, finding the optimal structured labeling—sometimes referred to as maximum a posteriori (MAP) inference—is, in general, NP-hard [12], due to the combinatorial structure of the problem. Many in...


No Small Linear Program Approximates Vertex Cover within a Factor 2-ε

The vertex cover problem is one of the most important and intensively studied combinatorial optimization problems. Khot and Regev [30, 31] proved that the problem is NP-hard to approximate within a factor 2 − ε, assuming the Unique Games Conjecture (UGC). This is tight because the problem has an easy 2-approximation algorithm. Without resorting to the UGC, the best inapproximability result for ...
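For context on the 2-approximation mentioned above, one standard way to obtain it (not necessarily the specific algorithm the authors have in mind) is by rounding the LP relaxation: solve

\[
  \min\; \sum_{v\in V} x_v \quad \text{s.t.} \quad x_u + x_v \ge 1 \;\;\forall \{u,v\}\in E, \quad 0 \le x_v \le 1,
\]

let \(x^*\) be an optimal solution, and output \(C = \{v : x^*_v \ge 1/2\}\). Every edge constraint \(x^*_u + x^*_v \ge 1\) forces at least one endpoint to reach \(1/2\), so \(C\) is a cover, and

\[
  |C| \;\le\; \sum_{v\in V} 2x^*_v \;=\; 2\,\mathrm{OPT}_{LP} \;\le\; 2\,\mathrm{OPT}.
\]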


An Analysis of Convex Relaxations for MAP Estimation

The problem of obtaining the maximum a posteriori estimate of a general discrete random field (i.e. a random field defined using a finite and discrete set of labels) is known to be NP-hard. However, due to its central importance in many applications, several approximate algorithms have been proposed in the literature. In this paper, we present an analysis of three such algorithms based on conve...


Fixing Max-Product: Convergent Message Passing Algorithms for MAP LP-Relaxations

We present a novel message passing algorithm for approximating the MAP problem in graphical models. The algorithm is similar in structure to max-product but unlike max-product it always converges, and can be proven to find the exact MAP solution in various settings. The algorithm is derived via block coordinate descent in a dual of the LP relaxation of MAP, but does not require any tunable para...
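For reference, the MAP LP relaxation referred to here is typically the local-polytope relaxation of a pairwise model, written below in the energy-minimization convention (the exact formulation and dual used in that paper may differ in details):

\[
\begin{aligned}
  \min_{\mu \ge 0}\;\; & \sum_i \sum_{x_i} \theta_i(x_i)\,\mu_i(x_i) \;+\; \sum_{(i,j)} \sum_{x_i, x_j} \theta_{ij}(x_i, x_j)\,\mu_{ij}(x_i, x_j) \\
  \text{s.t.}\;\; & \sum_{x_i} \mu_i(x_i) = 1 \;\;\forall i, \qquad
  \sum_{x_j} \mu_{ij}(x_i, x_j) = \mu_i(x_i) \;\;\forall (i,j),\, x_i, \qquad
  \sum_{x_i} \mu_{ij}(x_i, x_j) = \mu_j(x_j) \;\;\forall (i,j),\, x_j.
\end{aligned}
\]

Convergent message-passing schemes of this kind perform block coordinate descent on a Lagrangian dual of such an LP, updating one block of dual variables (messages) at a time.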


Linear Programming Relaxations and Belief Propagation - An Empirical Study

The problem of finding the most probable (MAP) configuration in graphical models comes up in a wide range of applications. In a general graphical model this problem is NP hard, but various approximate algorithms have been developed. Linear programming (LP) relaxations are a standard method in computer science for approximating combinatorial problems and have been used for finding the most proba...



Journal:

Volume   Issue

Pages  -

Publication date: 2017